Concurrent computing

Concurrent computing is a form of computing in which several computations are executed during overlapping time periods (''concurrently'') rather than ''sequentially'' (one completing before the next starts). This is a property of a system, which may be an individual program, a computer, or a network, in which there is a separate execution point or "thread of control" for each computation ("process"). A ''concurrent system'' is one in which a computation can make progress without waiting for all other computations to complete; in other words, more than one computation can make progress at "the same time".〔''Operating System Concepts'', 9th edition, Abraham Silberschatz. "Chapter 4: Threads"〕
As a programming paradigm, concurrent computing is a form of modular programming, namely factoring an overall computation into subcomputations that may be executed concurrently. Pioneers in the field of concurrent computing include Edsger Dijkstra, Per Brinch Hansen, and C.A.R. Hoare.
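As a minimal illustration of these definitions (a sketch, not part of the original article; Go is chosen here for its built-in concurrency primitives, and the names are hypothetical), the program below starts two computations whose lifetimes overlap. Each has its own thread of control (a goroutine), and neither waits for the other to complete before making progress.
<syntaxhighlight lang="go">
package main

import (
	"fmt"
	"sync"
)

// count is one independent computation ("process"): it has its own
// thread of control and makes progress without waiting for the other.
func count(name string, n int, wg *sync.WaitGroup) {
	defer wg.Done()
	for i := 1; i <= n; i++ {
		fmt.Printf("%s: step %d\n", name, i)
	}
}

func main() {
	var wg sync.WaitGroup
	wg.Add(2)
	// Both computations are in flight during overlapping time periods:
	// the program is concurrent regardless of how many cores run it.
	go count("A", 3, &wg)
	go count("B", 3, &wg)
	wg.Wait() // wait for both computations to complete
}
</syntaxhighlight>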
==Introduction==

Concurrent computing is related to but distinct from parallel computing, though the two concepts are frequently confused,〔"Concurrency is not Parallelism", ''Waza conference'', Jan 11, 2012, Rob Pike (slides) (video)〕〔"Parallelism vs. Concurrency"〕 and both can be described as "multiple processes executing ''during the same period of time''". In parallel computing, execution literally occurs at the same instant, for example on separate processors of a multi-processor machine, with the goal of speeding up computations; parallel computing is impossible on a single-core processor, as only one computation can occur at any instant (during any single clock cycle). By contrast, in concurrent computing the ''lifetimes'' of the processes overlap, but execution need not happen at the same instant. The goal here is to model processes in the outside world that happen concurrently, such as multiple clients accessing a server at the same time. Structuring software systems as multiple concurrent, communicating parts can be useful for tackling complexity, regardless of whether the parts can be executed in parallel.
For example, concurrent processes can be executed on a single core by interleaving the execution steps of each process via time slices: only one process runs at a time, and if it does not complete during its time slice, it is ''paused,'' another process begins or resumes, and then later the original process is resumed. In this way multiple processes are part-way through execution at a single instant, but only one process is being executed at that instant.
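A rough sketch of such time-sliced interleaving (illustrative only, not from the article): restricting the Go runtime to a single core means only one goroutine executes at any instant, yet both computations are part-way through execution at once. The explicit yield stands in for the end of a time slice.
<syntaxhighlight lang="go">
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	// Restrict execution to one core: at any instant only one
	// goroutine runs, so all progress comes from interleaving.
	runtime.GOMAXPROCS(1)

	var wg sync.WaitGroup
	wg.Add(2)
	for _, name := range []string{"A", "B"} {
		name := name
		go func() {
			defer wg.Done()
			for i := 1; i <= 3; i++ {
				fmt.Printf("%s: step %d\n", name, i)
				runtime.Gosched() // yield the processor: end this "time slice"
			}
		}()
	}
	wg.Wait()
}
</syntaxhighlight>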
Concurrent computations ''may'' be executed in parallel, for example by assigning each process to a separate processor or processor core, or by distributing a computation across a network, but in general the languages, tools, and techniques for parallel programming may not be suitable for concurrent programming, and vice versa.
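A sketch of the parallel case (again illustrative; it assumes a multi-core machine, and `sum` is a hypothetical helper): the overall computation is factored into chunks, with one goroutine per available core, so on such a machine the chunks can execute at literally the same instant.
<syntaxhighlight lang="go">
package main

import (
	"fmt"
	"runtime"
)

// sum adds the elements of one chunk and reports the result.
func sum(chunk []int, out chan<- int) {
	total := 0
	for _, v := range chunk {
		total += v
	}
	out <- total
}

func main() {
	data := make([]int, 1_000_000)
	for i := range data {
		data[i] = 1
	}

	// One worker per available core; the Go runtime may schedule
	// the workers onto separate cores, executing them in parallel.
	workers := runtime.NumCPU()
	size := (len(data) + workers - 1) / workers
	out := make(chan int, workers)
	spawned := 0
	for i := 0; i < len(data); i += size {
		end := i + size
		if end > len(data) {
			end = len(data)
		}
		go sum(data[i:end], out)
		spawned++
	}

	total := 0
	for i := 0; i < spawned; i++ {
		total += <-out
	}
	fmt.Println("total:", total) // prints 1000000
}
</syntaxhighlight>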
The exact timing of when tasks in a concurrent system are executed depends on the scheduling, and tasks need not always be executed concurrently. For example, given two tasks, T1 and T2:
* T1 may be executed and finished before T2 or ''vice versa'' (serial ''and'' sequential);
* T1 and T2 may be executed alternately (serial ''and'' concurrent);
* T1 and T2 may be executed simultaneously at the same instant of time (parallel ''and'' concurrent).
The word "sequential" is used as an antonym for both "concurrent" and "parallel"; when these are explicitly distinguished, ''concurrent/sequential'' and ''parallel/serial'' are used as opposing pairs. A schedule in which tasks execute one at a time (serially, no parallelism), without interleaving (sequentually, no concurrency: no task begins until the previous task ends) is called a ''serial schedule''. A set of tasks that can be scheduled serially is ''serializable'', which simplifies concurrency control.

Source: Wikipedia, the free encyclopedia ("Concurrent computing").